Generalized damped Newton algorithms in nonsmooth optimization via second-order subdifferentials

Authors

Abstract

The paper proposes and develops new globally convergent algorithms of the generalized damped Newton type for solving important classes of nonsmooth optimization problems. These algorithms are based on the theory and calculations of second-order subdifferentials of nonsmooth functions, employing the machinery of variational analysis and generalized differentiation. First we develop a globally superlinearly convergent damped Newton-type algorithm for the class of continuously differentiable functions with Lipschitzian gradients, which are nonsmooth of second order. Then we design such an algorithm to solve structured quadratic composite problems with extended-real-valued cost functions, which typically arise in machine learning and statistics. Finally, we present results of numerical experiments and compare the performance of our main algorithm, applied to an important class of Lasso problems, with that achieved by other first-order algorithms.
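To fix ideas, the following minimal Python sketch shows the damped Newton scheme in its smooth specialization, where the generalized Newton equation reduces to the classical system H(x)d = -∇φ(x) and the damping factor is chosen by Armijo backtracking. This is an illustration under that simplifying assumption, not the authors' method: the paper replaces the Hessian with an element of the second-order subdifferential, which is not implemented here, and the names phi, grad, hess and all constants are hypothetical.

    import numpy as np

    def damped_newton(phi, grad, hess, x0, sigma=1e-4, beta=0.5,
                      tol=1e-8, max_iter=100):
        # Sketch of a damped Newton iteration. In the paper's generalized
        # method the Hessian below is replaced by an element of the
        # second-order subdifferential; here phi is assumed twice
        # differentiable for illustration, and all constants are hypothetical.
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) <= tol:
                break
            d = np.linalg.solve(hess(x), -g)      # Newton direction
            t, fx = 1.0, phi(x)
            # Armijo backtracking picks the damping factor t.
            while t > 1e-12 and phi(x + t * d) > fx + sigma * t * (g @ d):
                t *= beta
            x = x + t * d
        return x

    # Usage on a strongly convex quadratic, whose minimizer solves Q x = b:
    Q = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    x_star = damped_newton(lambda x: 0.5 * x @ Q @ x - b @ x,
                           lambda x: Q @ x - b,
                           lambda x: Q,
                           np.zeros(2))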


Similar articles

Nonsmooth optimization via quasi-Newton methods

We investigate the behavior of quasi-Newton algorithms applied to minimize a nonsmooth function f, not necessarily convex. We introduce an inexact line search that generates a sequence of nested intervals containing a set of points of nonzero measure that satisfy the Armijo and Wolfe conditions if f is absolutely continuous along the line. Furthermore, the line search is guaranteed to terminat...

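The nested-interval construction described in the snippet above can be illustrated by a simple bracketing line search: a trial step that violates the Armijo (sufficient decrease) condition shrinks the upper end of the bracket, one that violates the Wolfe (curvature) condition raises the lower end, and the trial step is then doubled or bisected. A minimal sketch, assuming hypothetical helpers f(x) and dirderiv(x, d) for the objective value and directional derivative; it is not the authors' implementation.

    def armijo_wolfe_search(f, dirderiv, x, d, c1=1e-4, c2=0.9, max_iter=50):
        # Bracketing line search: [lo, hi] always contains acceptable steps.
        f0 = f(x)
        g0 = dirderiv(x, d)              # assumed negative (descent direction)
        lo, hi, t = 0.0, float("inf"), 1.0
        for _ in range(max_iter):
            if f(x + t * d) > f0 + c1 * t * g0:
                hi = t                   # Armijo fails: step too long
            elif dirderiv(x + t * d, d) < c2 * g0:
                lo = t                   # Wolfe curvature fails: step too short
            else:
                return t                 # both conditions satisfied
            t = 2.0 * lo if hi == float("inf") else 0.5 * (lo + hi)
        return t                         # fallback after max_iter bisections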

Second-order nonsmooth optimization for H∞ synthesis

The standard way to compute H∞ feedback controllers uses algebraic Riccati equations and is therefore of limited applicability. Here we present a new approach to the H∞ output feedback control design problem, which is based on nonlinear and nonsmooth mathematical programming techniques. Our approach avoids the use of Lyapunov variables, and is therefore flexible in many practical situations.


A Quasi-Newton Approach to Nonsmooth Convex Optimization

We extend the well-known BFGS quasi-Newton method and its limited-memory variant (LBFGS) to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: The local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We apply the resulting subLBFGS algorithm to L2-reg...

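For the limited-memory variant (LBFGS) mentioned above, the search direction is computed by the standard two-loop recursion over recent curvature pairs. The sketch below shows only that recursion applied to a chosen (sub)gradient g; subLBFGS's additional machinery, such as searching the subdifferential for a subgradient that guarantees a descent direction, is not implemented, and all names are illustrative.

    import numpy as np

    def lbfgs_direction(g, s_hist, y_hist):
        # Two-loop recursion: returns -H g, where H approximates the inverse
        # Hessian built from pairs s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k.
        q = np.array(g, dtype=float)
        alphas = []
        for s, y in zip(reversed(s_hist), reversed(y_hist)):
            a = (s @ q) / (y @ s)
            alphas.append(a)
            q -= a * y
        if s_hist:                       # scale by gamma = s'y / y'y
            s, y = s_hist[-1], y_hist[-1]
            q *= (s @ y) / (y @ y)
        for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):
            b = (y @ q) / (y @ s)
            q += (a - b) * s
        return -q                        # quasi-Newton direction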

Partial second-order subdifferentials of -prox-regular functions

Although prox-regular functions in general are nonconvex, they possess properties that one would expect to find in convex or lower-C2 functions. The class of prox-regular functions covers all convex functions, lower-C2 functions, and strongly amenable functions. At first, these functions were identified in finite dimensions using the proximal subdifferential. Then, the definition of prox-regula...


Mixed H2/H∞ Control via Nonsmooth Optimization

We present a new approach to mixed H2/H∞ output feedback control synthesis. Our method uses nonsmooth mathematical programming techniques to compute locally optimal H2/H∞ controllers, which may have a predefined structure. We prove global convergence of our method and present tests to validate it numerically. 1. Introduction. Mixed H2/H∞ output feedback control is a prominent example of ...



Journal

Journal title: Journal of Global Optimization

Year: 2022

ISSN: 1573-2916, 0925-5001

DOI: https://doi.org/10.1007/s10898-022-01248-7